52 research outputs found

    Dynamic criticality assessment as a supporting tool for knowledge retention to increase the efficiency and effectiveness of maintenance

    Digitalisation offers industrial companies a multitude of opportunities and new technologies (e.g. Big Data Analytics, Cloud Computing, Internet of Things), but it also poses a great challenge for them. The choice of maintenance strategy and the increasing complexity and automation level of assets and asset components have a particularly decisive influence. Given the technological progress and the new possibilities offered by Industry 4.0, the interaction of different systems and assets is essential to increase the efficiency of maintenance processes within the value-added chain and to guarantee flexibility permanently. These factors increase the importance of a process methodology for dynamically evaluating an asset's condition over its entire life cycle and under changing framework and production conditions. Such a methodology considers legal and environmentally relevant requirements based on the HAZOP procedure (IEC 61882), ensures the traceability of the results, and systematically records the asset knowledge gained in this way so that it is no longer tied to individual employees, as is currently the case. The criticality assessment, as a basic component of Lean Smart Maintenance (the dynamic, learning- and knowledge-oriented maintenance philosophy), offers such a holistic, value-oriented approach. A targeted optimization of the maintenance strategy becomes possible through automated evaluation of the assets and identification of the most critical ones, based on company-specific criteria derived from the company's success factors and considering all three management levels: normative, strategic and operational. By incorporating the resource knowledge into the maintenance-strategy optimization process and applying knowledge-management methodologies suited to the prevailing data quality, efficiency and effectiveness are permanently ensured in the sense of continuous improvement.
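
    The criteria-based ranking described above can be illustrated with a minimal sketch: a weighted-sum criticality score over company-specific criteria. The criteria names, weights, and the 1–10 rating scale are illustrative assumptions, not the assessment scheme from the paper.

```python
# Minimal sketch of a weighted multi-criteria criticality score.
# Criteria, weights, and the 1-10 rating scale are illustrative assumptions,
# not the company-specific scheme described in the paper.

CRITERIA_WEIGHTS = {
    "production_impact": 0.35,   # consequence of downtime on the value-added chain
    "safety_environment": 0.30,  # legal / HSE relevance (cf. HAZOP-style assessment)
    "failure_frequency": 0.20,   # observed or expected failure rate
    "repair_effort": 0.15,       # cost and duration of restoration
}

def criticality_score(ratings: dict[str, float]) -> float:
    """Weighted sum of criterion ratings (each rated 1..10)."""
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)

def rank_assets(assets: dict[str, dict[str, float]]) -> list[tuple[str, float]]:
    """Return assets sorted from most to least critical."""
    scored = {name: criticality_score(r) for name, r in assets.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    assets = {
        "compressor_01": {"production_impact": 9, "safety_environment": 7,
                          "failure_frequency": 4, "repair_effort": 6},
        "conveyor_12":   {"production_impact": 5, "safety_environment": 3,
                          "failure_frequency": 8, "repair_effort": 3},
    }
    for name, score in rank_assets(assets):
        print(f"{name}: {score:.2f}")
```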

    Value Stream Mapping and Process Mining: A Lean Method Supported by Data Analytics

    The analysis and subsequent reorganization of an order-based production is a typical application of value stream mapping. Value stream mapping is a lean-management method for mapping the current state of the series of processes necessary to manufacture a product or provide a service. The ever-increasing digitalization emphasizes the importance of the information flow which, besides the material flow, is the second focus of value stream mapping and is needed to gain valuable insights into the production process by applying state-of-the-art data analytics methods. Process Mining is one possible method for analysing (business) processes and process sequences based on event logs. This paper illustrates a method of combining conventional value stream mapping and Process Mining: while value stream mapping shows the material and information flows as people think they are, Process Mining shows how they actually are, based on the data fingerprint. Comparing the two outcomes allows conclusions for the subsequent value stream design, with a special focus on the use of data for Industry 4.0 applications. Applying both methods to an order-based production makes it possible to present and compare their results and to discuss the advantages and disadvantages of each.
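
    To illustrate the event-log side of this comparison, the sketch below derives a directly-follows relation (a basic Process Mining artefact) from a small event log using only the Python standard library. The log format and activity names are illustrative assumptions, not data from the order-production case study.

```python
# Minimal sketch: deriving a directly-follows graph from an event log.
# The log format and activity names are illustrative assumptions, not data
# from the order-production case study.
from collections import Counter, defaultdict

# Event log as (case_id, activity, timestamp) tuples.
event_log = [
    ("order_1", "receive_order", 1), ("order_1", "machining", 2),
    ("order_1", "assembly", 3),      ("order_1", "shipping", 4),
    ("order_2", "receive_order", 1), ("order_2", "assembly", 2),
    ("order_2", "machining", 3),     ("order_2", "shipping", 4),
]

def directly_follows(log):
    """Count how often activity A is directly followed by activity B per case."""
    traces = defaultdict(list)
    for case_id, activity, _ts in sorted(log, key=lambda e: (e[0], e[2])):
        traces[case_id].append(activity)
    dfg = Counter()
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

for (a, b), count in directly_follows(event_log).items():
    print(f"{a} -> {b}: {count}")
```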

    Evolution of a Lean Smart Maintenance Maturity Model towards the new Age of Industry 4.0

    Over the last few years, the complexity of asset and maintenance management of industrial plants and machinery in the producing industry has risen due to higher competition and volatile environments. Smart factories, the Internet of Things (IoT) and the underlying digitisation of a significant number of processes are changing the way we have to think and work in terms of asset management. The existing Lean Smart Maintenance (LSM) philosophy, which focuses on the cost-efficient (lean) and the learning-organisation (smart) perspectives, enables value-oriented, dynamic, and smart maintenance/asset management. The associated LSM maturity model is the evaluation tool that covers the normative, strategic, and operational aspects of industrial asset management, and numerous reorganisation projects in industrial companies have already been carried out on its basis. However, due to the ever-increasing development of Industry 4.0 (I4.0), it is necessary to extend the model with selected aspects of digitisation and digitalisation. Based on a structured literature review (SLR) of state-of-the-art I4.0 maturity models, we identified the essential maturity items for I4.0. To restructure and expand the existing LSM maturity model, the principles of design science research (DSR) were used, and the architecture of the model was based on the structure of the Capability Maturity Model Integration (CMMI). The further-developed LSM maturity model thus covers the future requirements of I4.0 and data science. It was possible to enhance existing categories with new artefacts from the I4.0 range to represent the influence of cyber-physical systems (CPS), (big) data and information management, condition monitoring (CM) and more. Furthermore, the originally defined LSM model was restructured to simplify its application in industrial use cases.

    Validation of a Lean Smart Maintenance Maturity Model

    Rising complexity in industrial asset and maintenance management due to more volatile business environments and megatrends like Industry 4.0 has led to the need for a new perspective on these management domains. The Lean Smart Maintenance (LSM) philosophy, which focuses on both the efficient (lean) and the learning (smart) organization, was introduced during the past few years, and a corresponding maturity model (MM) has been developed to guide organizations on their way to asset and maintenance excellence. This paper discusses use cases in which the usability and the generic aspect of the LSM MM are validated using data from three different asset management assessment projects in organizations with different types of production. Research results show that the LSM MM can be used as a basis for management system improvement, independent of production types such as one-of-a-kind production, mass production and continuous production.

    Development of a Generic Framework to Assess Asset Management Maturity within Organisations

    With the comprehensive Lean Smart Maintenance philosophy and its associated maturity model, organisations were given a tool to reach asset and maintenance excellence. This paper discusses the approach used to transfer the scientifically based methods and concepts of the Lean Smart Maintenance Maturity Model into an assessment structure, generating a generic tool for collecting the complete and correct information necessary to determine an organisation's maturity level. Research results show that with a standardised assessment process combined with continuous improvement cycles, a more accurate assessment of a company's maturity is possible. A well-structured MM assessment supports less experienced assessors, whereas experienced assessors will not need a full questionnaire but only a well-structured list of items and their maturity levels.
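
    The aggregation step of such an assessment can be sketched as follows. The category names, item ratings, and the rule of taking the minimum rating per category (a common CMMI-style convention) are illustrative assumptions rather than the published LSM assessment logic.

```python
# Minimal sketch of aggregating item-level maturity ratings into category and
# overall maturity levels. Category names, item ratings, and the "minimum
# rating per category" rule are illustrative assumptions, not the published
# LSM assessment logic.
from statistics import mean

assessment = {
    "asset_strategy":       {"criticality_assessment": 3, "lifecycle_costing": 2},
    "data_and_it":          {"condition_monitoring": 4, "data_quality": 2},
    "knowledge_management": {"lessons_learned": 3, "skill_matrix": 3},
}

def category_levels(items: dict[str, dict[str, int]]) -> dict[str, int]:
    """A category is only as mature as its weakest item (CMMI-style convention)."""
    return {cat: min(scores.values()) for cat, scores in items.items()}

def overall_maturity(items: dict[str, dict[str, int]]) -> float:
    """Average of category levels as a single headline figure."""
    return mean(category_levels(items).values())

print(category_levels(assessment))
print(f"overall maturity: {overall_maturity(assessment):.1f}")
```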

    A Knowledge-Based Digital Lifecycle-Oriented Asset Optimisation

    The digitalisation of the value chain promotes sophisticated virtual product models known as digital twins (DT) in all asset-life-cycle (ALC) phases. These models, however, fail to represent all ALC phases and do not allow continuous life-cycle costing (LCC). Hence, energy efficiency and resource optimisation across the entire circular value chain are neglected. This paper demonstrates how ALC optimisation can be achieved by incorporating all product life-cycle phases through the use of a RAMS² toolbox and the generation of a knowledge-based DT. The benefits of the developed model are demonstrated in a simulation, considering RAMS² (Reliability, Availability, Maintainability, Safety and Sustainability) and the linking of heterogeneous data with the help of a dynamic Bayesian network (DBN).
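
    The role of the dynamic Bayesian network in linking heterogeneous data can be illustrated with a minimal two-time-slice sketch: a hidden degradation state is propagated by a transition model and updated with a noisy condition-monitoring observation. The states, transition probabilities, and sensor likelihoods are illustrative assumptions, not the RAMS² model from the paper.

```python
# Minimal two-time-slice sketch in the spirit of a dynamic Bayesian network:
# a hidden degradation state is propagated with a transition model and updated
# with a noisy condition-monitoring observation (a basic filtering step).
# States, transition probabilities and sensor likelihoods are illustrative
# assumptions, not the RAMS² model from the paper.

STATES = ["ok", "degraded", "failed"]

# P(state at t+1 | state at t)
TRANSITION = {
    "ok":       {"ok": 0.90, "degraded": 0.09, "failed": 0.01},
    "degraded": {"ok": 0.00, "degraded": 0.85, "failed": 0.15},
    "failed":   {"ok": 0.00, "degraded": 0.00, "failed": 1.00},
}

# P(sensor alarm | state)
ALARM_LIKELIHOOD = {"ok": 0.05, "degraded": 0.60, "failed": 0.95}

def predict(belief):
    """Propagate the belief one time slice forward."""
    return {s: sum(belief[p] * TRANSITION[p][s] for p in STATES) for s in STATES}

def update(belief, alarm: bool):
    """Condition the belief on a sensor alarm (or its absence)."""
    unnorm = {
        s: belief[s] * (ALARM_LIKELIHOOD[s] if alarm else 1 - ALARM_LIKELIHOOD[s])
        for s in STATES
    }
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

belief = {"ok": 1.0, "degraded": 0.0, "failed": 0.0}
for t, alarm in enumerate([False, False, True, True], start=1):
    belief = update(predict(belief), alarm)
    print(f"t={t}: " + ", ".join(f"{s}={belief[s]:.2f}" for s in STATES))
```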

    scRNA-Seq of Cultured Human Amniotic Fluid from Fetuses with Spina Bifida Reveals the Origin and Heterogeneity of the Cellular Content

    Amniotic fluid has been proposed as an easily available source of cells for numerous applications in regenerative medicine and tissue engineering. The use of amniotic fluid cells in biomedical applications necessitates their unequivocal characterization; however, the exact cellular composition of amniotic fluid and the precise tissue origins of these cells remain largely unclear. Using cells cultured from the human amniotic fluid of fetuses with spina bifida aperta and of a healthy fetus, we performed single-cell RNA sequencing to characterize the tissue origin and marker expression of cultured amniotic fluid cells at the single-cell level. Our analysis revealed nine different cell types of stromal, epithelial and immune cell phenotypes, and from various fetal tissue origins, demonstrating the heterogeneity of the cultured amniotic fluid cell population at single-cell resolution. It also identified cell types of neural origin in amniotic fluid from fetuses with spina bifida aperta. Our data provide a comprehensive list of markers for the characterization of the various progenitor and terminally differentiated cell types in cultured amniotic fluid. This study highlights the relevance of single-cell analysis approaches for the characterization of amniotic fluid cells in order to harness their full potential in biomedical research and clinical applications.
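
    A typical single-cell clustering and marker-gene workflow of the kind used for such a characterization can be sketched with Scanpy. The input file name, parameter values, and the choice of Leiden clustering are generic assumptions and do not reproduce the authors' published pipeline.

```python
# Generic scRNA-seq clustering and marker-gene workflow with Scanpy.
# The input file, parameter values and the Leiden clustering choice are
# illustrative assumptions, not the authors' published analysis pipeline.
import scanpy as sc

adata = sc.read_h5ad("amniotic_fluid_cells.h5ad")  # hypothetical input file

# Basic quality filtering and normalisation.
sc.pp.filter_cells(adata, min_genes=200)
sc.pp.filter_genes(adata, min_cells=3)
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)

# Feature selection, dimensionality reduction and clustering.
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
sc.tl.pca(adata)
sc.pp.neighbors(adata)
sc.tl.leiden(adata, key_added="cell_cluster")  # requires the leidenalg package
sc.tl.umap(adata)

# Marker genes per cluster, used to annotate putative cell types / tissue origin.
sc.tl.rank_genes_groups(adata, groupby="cell_cluster", method="wilcoxon")
```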

    Search for dark matter produced in association with bottom or top quarks in √s = 13 TeV pp collisions with the ATLAS detector

    A search for weakly interacting massive particle dark matter produced in association with bottom or top quarks is presented. Final states containing third-generation quarks and missing transverse momentum are considered. The analysis uses 36.1 fb⁻¹ of proton–proton collision data recorded by the ATLAS experiment at √s = 13 TeV in 2015 and 2016. No significant excess of events above the estimated backgrounds is observed. The results are interpreted in the framework of simplified models of spin-0 dark-matter mediators. For colour-neutral spin-0 mediators produced in association with top quarks and decaying into a pair of dark-matter particles, mediator masses below 50 GeV are excluded assuming a dark-matter candidate mass of 1 GeV and unitary couplings. For scalar and pseudoscalar mediators produced in association with bottom quarks, the search sets limits on the production cross-section of 300 times the predicted rate for mediators with masses between 10 and 50 GeV, assuming a dark-matter mass of 1 GeV and unitary coupling. Constraints on colour-charged scalar simplified models are also presented. Assuming a dark-matter particle mass of 35 GeV, mediator particles with mass below 1.1 TeV are excluded for couplings yielding a dark-matter relic density consistent with measurements.

    Search for single production of vector-like quarks decaying into Wb in pp collisions at √s = 8 TeV with the ATLAS detector
